
# Efficient Inference with Small Parameters

## Text2graph R1 Qwen2.5 0.5b
License: Apache-2.0
A text-to-graph information extraction model based on Qwen-2.5-0.5B, jointly trained with reinforcement learning (GRPO) and supervised learning.
Tags: Knowledge Graph, English
Author: Ihor · 199 · 20
## Granite 3b Code Base 2k
License: Apache-2.0
Granite-3B-Code-Base-2K is a decoder-only model developed by IBM Research, designed specifically for code generation; it has 3B parameters and supports 116 programming languages.
Tags: Large Language Model, Transformers, Other
Author: ibm-granite · 711 · 37
## CroissantLLMBase
License: MIT
CroissantLLM is a 1.3-billion-parameter language model pre-trained on 3 trillion English-French bilingual tokens, intended to provide a high-performance, fully open-source bilingual model for the research and industrial communities.
Tags: Large Language Model, Transformers, Multilingual
Author: croissantllm · 901 · 32
© 2025 AIbase